Discrimination Between Gamma and Weibull Distribution Based on Kullback-Leibler Divergence
Authors
Abstract
Similar Resources
Kullback-Leibler Divergence for the Normal-Gamma Distribution
We derive the Kullback-Leibler divergence for the normal-gamma distribution and show that it is identical to the Bayesian complexity penalty for the univariate general linear model with conjugate priors. Based on this finding, we provide two applications of the KL divergence, one in simulated and one in empirical data.
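The entry's own abstract is not shown, but the title concerns discriminating a gamma from a Weibull model via the Kullback-Leibler divergence. A minimal numerical sketch of that divergence, assuming illustrative parameter values and a hypothetical helper name `kl_gamma_weibull` (neither taken from the paper):

```python
# Hedged sketch (not the paper's exact procedure): approximate the
# Kullback-Leibler divergence KL(f || g) between a gamma density f and a
# Weibull density g by numerical quadrature.
import numpy as np
from scipy import stats
from scipy.integrate import quad

def kl_gamma_weibull(shape_g, scale_g, shape_w, scale_w):
    """KL(Gamma || Weibull) = integral of f(x) * log(f(x)/g(x)) over (0, inf)."""
    f = stats.gamma(a=shape_g, scale=scale_g).pdf
    g = stats.weibull_min(c=shape_w, scale=scale_w).pdf
    integrand = lambda x: f(x) * np.log(f(x) / g(x))
    # Truncate the domain slightly to avoid 0 * log(0) at the origin and
    # floating-point underflow deep in the tails.
    val, _err = quad(integrand, 1e-12, 100.0, limit=200)
    return val

kl = kl_gamma_weibull(2.0, 1.0, 1.5, 2.0)
# KL is non-negative and zero only when the densities coincide; a larger
# value means the two candidate models are easier to tell apart.
print(kl)
```

In KL-based discrimination procedures, such a divergence (estimated from data) serves as the criterion for choosing between the two candidate families.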
Speech Probability Distribution Based on Generalized Gamma Distribution
In this paper, we propose a new speech probability distribution, two-sided generalized gamma distribution (GΓD) for an efficient parametric characterization of speech spectra. GΓD forms a generalized class of parametric distributions including the Gaussian, Laplacian and Gamma probability density functions (pdf’s) as special cases. All the parameters associated with the GΓD are estimated by the...
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating the true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using an appropriate (in some sense; see below) model f_θ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, called a model confidence set, for the unknown model h(·). Application of such confide...
Clustering on Uncertain Data Using Kullback-Leibler Divergence Measurement Based on Probability Distribution
Cluster analysis is one of the important data analysis methods and is a very complex task. It is the art of detecting groups of similar objects in large data sets without requiring predefined groups, explicit features, or prior knowledge of the data. Clustering on uncertain data is particularly difficult, both in modeling similarity between uncertain data objects and in developing efficient computat...
Model Averaging Based on Kullback-Leibler Distance
This paper proposes a model averaging method based on Kullback-Leibler distance under a homoscedastic normal error term. The resulting model average estimator is proved to be asymptotically optimal. When combining least squares estimators, the model average estimator is shown to have the same large sample properties as the Mallows model average (MMA) estimator developed by Hansen (2007). We sho...
Journal
عنوان ژورنال: Afyon Kocatepe University Journal of Sciences and Engineering
سال: 2017
ISSN: 2147-5296,2149-3367
DOI: 10.5578/fmbd.60774